Convexity, Detection, and Generalized f-divergences

Authors

  • John Duchi
  • Khashayar Khosravi
  • Feng Ruan
Abstract

The goal of the multi-class classification problem is to find a discriminant function that minimizes the expected 0-1 loss. However, minimizing the 0-1 loss directly is often computationally intractable, and practical algorithms are usually based on a convex relaxation Φ of the 0-1 loss, called a surrogate loss. In many applications, the covariates are not available directly but are received only after passing through a 'dimension-reducing' quantizer, or are deliberately transformed in a 'feature selection' stage to improve interpretability. In experimental design, one fundamental question is how to select the quantization procedure Q that decreases the optimal Φ-risk RΦ(Q) the most. Another fundamental question is to characterize all surrogate losses Φ that are universally equivalent to the 0-1 loss, in the sense that Φ and the 0-1 loss induce the same ordering on quantizers by their optimal risks.
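A minimal numerical sketch (my own illustration, not taken from the paper) of the comparison described above, simplified to a binary problem: two hypothetical quantizers Q1 and Q2 of a discrete covariate X are ranked by the optimal 0-1 risk and by the optimal hinge surrogate risk computed after quantization. The toy distribution p_x and posterior eta below are assumptions.

    # Compare two quantizers through optimal 0-1 risk and optimal hinge surrogate risk.
    import numpy as np

    # Toy joint distribution of (X, Y): X in {0,...,5}, Y in {-1,+1}.
    p_x = np.array([0.2, 0.1, 0.15, 0.25, 0.2, 0.1])   # marginal of X
    eta = np.array([0.9, 0.8, 0.6, 0.4, 0.2, 0.1])     # eta[x] = P(Y=+1 | X=x)

    def optimal_risks(quantizer, n_bins):
        """Optimal 0-1 risk and optimal hinge risk for decisions based on Z = quantizer(X)."""
        bayes, hinge = 0.0, 0.0
        for z in range(n_bins):
            mask = np.array([quantizer(x) == z for x in range(len(p_x))])
            pz = p_x[mask].sum()
            if pz == 0:
                continue
            eta_z = (p_x[mask] * eta[mask]).sum() / pz   # P(Y=+1 | Z=z)
            bayes += pz * min(eta_z, 1 - eta_z)          # best 0-1 decision per bin
            hinge += pz * 2 * min(eta_z, 1 - eta_z)      # optimal conditional hinge risk
        return bayes, hinge

    Q1 = lambda x: 0 if x <= 2 else 1          # splits high-eta from low-eta points
    Q2 = lambda x: 0 if x in (0, 5) else 1     # mixes easy and hard points together

    for name, Q in [("Q1", Q1), ("Q2", Q2)]:
        b, h = optimal_risks(Q, 2)
        print(f"{name}: optimal 0-1 risk = {b:.3f}, optimal hinge risk = {h:.3f}")

In this binary setting the optimal hinge risk is exactly twice the optimal 0-1 risk, so the two losses rank Q1 and Q2 identically; universal equivalence asks when such agreement holds for every quantizer and every underlying distribution.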

Similar articles

Generalizing Jensen and Bregman divergences with comparative convexity and the statistical Bhattacharyya distances with comparable means

Comparative convexity is a generalization of convexity relying on abstract notions of means. We define the (skew) Jensen divergence and the Jensen diversity from the viewpoint of comparative convexity, and show how to obtain the generalized Bregman divergences as limit cases of skewed Jensen divergences. In particular, we report explicit formulas for these generalized Bregman divergences when con...
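As a quick illustration of the limit referred to above (my own sketch, assuming the negative-entropy generator F(x) = sum_i x_i log x_i, whose Bregman divergence is the generalized Kullback-Leibler divergence), the scaled skew Jensen divergence J_alpha(p, q)/alpha approaches the Bregman divergence B_F(p, q) as alpha -> 0:

    import numpy as np

    def F(x):                                        # negative-entropy generator (assumed)
        return np.sum(x * np.log(x))

    def skew_jensen(p, q, alpha):
        return alpha * F(p) + (1 - alpha) * F(q) - F(alpha * p + (1 - alpha) * q)

    def bregman(p, q):                               # B_F(p, q) = F(p) - F(q) - <grad F(q), p - q>
        grad_q = np.log(q) + 1.0
        return F(p) - F(q) - np.dot(grad_q, p - q)   # generalized KL divergence for this F

    p = np.array([0.2, 0.5, 0.3])
    q = np.array([0.4, 0.4, 0.2])
    for alpha in [0.5, 0.1, 0.01, 0.001]:
        print(alpha, skew_jensen(p, q, alpha) / alpha)   # converges to the Bregman value
    print("Bregman limit:", bregman(p, q))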


Quantum f-divergences and error correction

Quantum f-divergences are a quantum generalization of the classical notion of f-divergences, and are a special case of Petz's quasi-entropies. Many well-known distinguishability measures of quantum states are given by, or derived from, f-divergences; special examples include the quantum relative entropy, the Rényi relative entropies, and the Chernoff and Hoeffding measures. Here we show that t...
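For concreteness, a small sketch (not from the paper) of one special case mentioned above, the quantum relative entropy S(rho || sigma) = Tr[rho (log rho - log sigma)], evaluated for two assumed 2x2 density matrices:

    import numpy as np
    from scipy.linalg import logm

    def quantum_relative_entropy(rho, sigma):
        # Assumes density matrices with the support of rho contained in the support of sigma.
        return np.real(np.trace(rho @ (logm(rho) - logm(sigma))))

    rho = np.array([[0.7, 0.2], [0.2, 0.3]])   # Hermitian, positive semidefinite, trace 1
    sigma = 0.5 * np.eye(2)                    # maximally mixed state
    print(quantum_relative_entropy(rho, sigma))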


Risk-Based Generalizations of f-divergences

We derive a generalized notion of f-divergences, called (f, l)-divergences. We show that this generalization enjoys many of the nice properties of f-divergences, although it is a richer family. It also provides alternative definitions of standard divergences in terms of surrogate risks. As a first practical application of this theory, we derive a new estimator for the Kullback-Leibler divergenc...
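A minimal numerical sketch (my own illustration, with two assumed discrete distributions) of the classical risk/divergence link that such generalizations build on: with equal priors, the optimal 0-1 risk of testing P against Q equals (1 - TV(P, Q))/2, where total variation is the f-divergence generated by f(t) = |t - 1|/2.

    import numpy as np

    P = np.array([0.5, 0.3, 0.2])
    Q = np.array([0.2, 0.3, 0.5])

    tv = 0.5 * np.abs(P - Q).sum()                    # f-divergence with f(t) = |t - 1| / 2
    bayes_risk = np.minimum(0.5 * P, 0.5 * Q).sum()   # optimal 0-1 risk with equal priors
    print(tv, bayes_risk, (1 - tv) / 2)               # the last two values agree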


Schur Power Convexity of the Daróczy Means

In this paper, Schur convexity is generalized to Schur f-convexity, which contains Schur geometric convexity, harmonic convexity, and so on. When f : R+ → R is defined by f(x) = (x^m − 1)/m if m ≠ 0 and f(x) = ln x if m = 0, necessary and sufficient conditions for f-convexity (called Schur m-power convexity) of the Daróczy means are given, which improve, generalize and unify Shi et...
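A small sketch (illustration only) of the generator named above, f_m(x) = (x^m − 1)/m for m ≠ 0 and f_0(x) = ln x, with a numerical check that f_m → f_0 as m → 0:

    import numpy as np

    def f(x, m):
        return np.log(x) if m == 0 else (x**m - 1.0) / m

    x = 2.5
    for m in [1.0, 0.1, 0.01, 0.0]:
        print(m, f(x, m))   # values approach ln(2.5) ≈ 0.916 as m -> 0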


On Convex Generalized Systems

In the paper, the set-convexity and mapping-convexity properties of the extended images of generalized systems are considered. By using these image properties and the tools of topological linear spaces, separation schemes ensuring the impossibility of generalized systems are developed; then, special problem classes are investigated.



Journal:

Volume   Issue

Pages  -

Publication date: 2015